Scalable computations for nonstationary Gaussian processes
Authors
Abstract
Nonstationary Gaussian process models can capture complex spatially varying dependence structures in spatial data. However, the large number of observations in modern datasets makes fitting such models computationally intractable with conventional dense linear algebra. In addition, derivative-free or even first-order optimization methods can be slow to converge when estimating many parameters. We present here a computational framework that couples an algebraic block-diagonal plus low-rank covariance matrix approximation with stochastic trace estimation to facilitate the efficient use of second-order solvers for maximum likelihood estimation. We demonstrate the effectiveness of these methods by simultaneously fitting 192 parameters in the popular nonstationary model of Paciorek and Schervish using 107,600 sea surface temperature anomaly measurements.
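As a rough sketch of the two computational ingredients named in the abstract, the Python snippet below (assuming NumPy and SciPy; names such as bdlr_solve and hutchinson_trace are illustrative and not taken from the paper or its code) combines a Woodbury solve for a block-diagonal plus low-rank covariance B + UUᵀ with a Hutchinson stochastic estimator for the trace term tr(Σ⁻¹ ∂Σ/∂θⱼ) that appears in the gradient of the Gaussian log-likelihood.

import numpy as np
from scipy.linalg import cho_factor, cho_solve

def bdlr_solve(blocks, U, rhs):
    """Solve (B + U U^T) x = rhs with B block-diagonal, given as a list of blocks."""
    chols = [cho_factor(Bk) for Bk in blocks]
    offsets = np.cumsum([0] + [Bk.shape[0] for Bk in blocks])

    def apply_Binv(v):
        # Apply B^{-1} blockwise using the Cholesky factor of each diagonal block.
        out = np.empty_like(v)
        for k, c in enumerate(chols):
            i, j = offsets[k], offsets[k + 1]
            out[i:j] = cho_solve(c, v[i:j])
        return out

    # Woodbury identity:
    # (B + U U^T)^{-1} r = B^{-1} r - B^{-1} U (I + U^T B^{-1} U)^{-1} U^T B^{-1} r
    Bi_rhs = apply_Binv(rhs)
    Bi_U = apply_Binv(U)
    small = np.eye(U.shape[1]) + U.T @ Bi_U
    return Bi_rhs - Bi_U @ np.linalg.solve(small, U.T @ Bi_rhs)

def hutchinson_trace(apply_A, n, num_probes=30, seed=None):
    """Estimate tr(A) as the average of z^T A z over Rademacher probe vectors z."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ apply_A(z)
    return total / num_probes

# Example: the trace term tr(Sigma^{-1} dSigma_j) in the gradient of the
# log-likelihood can be estimated matrix-free, touching Sigma only through solves:
#   hutchinson_trace(lambda z: bdlr_solve(blocks, U, dSigma_j @ z), n)

Every step here is either a small dense factorization of a diagonal block or a matrix-vector product, which is the kind of structure that makes second-order optimization over many covariance parameters affordable; the approximation and solvers developed in the paper are of course more elaborate.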
Similar resources
Scalable Gaussian Processes for Supervised Hashing
We propose a flexible procedure for large-scale image search by hash functions with kernels. Our method treats binary codes and pairwise semantic similarity as latent and observed variables, respectively, in a probabilistic model based on Gaussian processes for binary classification. We present an efficient inference algorithm with the sparse pseudo-input Gaussian process (SPGP) model and paral...
Product Kernel Interpolation for Scalable Gaussian Processes
Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM ba...
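As a concrete illustration of MVM-only inference, here is a minimal sketch in the spirit of SKI (not the authors' or GPyTorch's implementation; the grid, kernel, and helper names are assumptions): the solve (K + σ²I)⁻¹y is carried out by conjugate gradients, and the kernel is only ever touched through the fast approximate product W K_UU Wᵀ v, where W is a sparse matrix that linearly interpolates the inputs onto a regular grid of inducing points.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, cg

def rbf(a, b, lengthscale=0.3):
    # Squared-exponential kernel on 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def linear_interp_weights(x, grid):
    # Sparse W: each input gets weights on its two neighbouring grid points.
    h = grid[1] - grid[0]
    left = np.clip(np.searchsorted(grid, x) - 1, 0, len(grid) - 2)
    w_right = (x - grid[left]) / h
    rows = np.repeat(np.arange(len(x)), 2)
    cols = np.stack([left, left + 1], axis=1).ravel()
    vals = np.stack([1.0 - w_right, w_right], axis=1).ravel()
    return sp.csr_matrix((vals, (rows, cols)), shape=(len(x), len(grid)))

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 2000))
y = np.sin(6.0 * x) + 0.1 * rng.standard_normal(x.size)

grid = np.linspace(0.0, 1.0, 100)     # inducing-point grid
K_uu = rbf(grid, grid)
W = linear_interp_weights(x, grid)
noise_var = 0.1 ** 2

def mvm(v):
    # (W K_uu W^T + sigma^2 I) v using only sparse and small dense products.
    return W @ (K_uu @ (W.T @ v)) + noise_var * v

A = LinearOperator((x.size, x.size), matvec=mvm)
alpha, info = cg(A, y)                # alpha ~= (K + sigma^2 I)^{-1} y, MVMs only

The point of the structure is that K_uu is small and W is sparse, so each conjugate-gradient iteration costs far less than a dense n-by-n product.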
The Rate of Entropy for Gaussian Processes
In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropies, as is done for the Shannon and Renyi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Renyi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...
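For context, the construction alluded to here is the standard one (stated from general knowledge rather than quoted from that paper): the entropy rate is the limit of conditional entropies, and for a stationary Gaussian process with spectral density f(λ) the Shannon (differential) entropy rate has the closed form

h(\mathcal{X}) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1),
\qquad
h = \frac{1}{2}\log(2\pi e) + \frac{1}{4\pi}\int_{-\pi}^{\pi} \log f(\lambda)\,\mathrm{d}\lambda .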
Thoughts on Massively Scalable Gaussian Processes
We introduce a framework and early results for massively scalable Gaussian processes (MSGP), significantly extending the KISS-GP approach of Wilson and Nickisch (2015). The MSGP framework enables the use of Gaussian processes (GPs) on billions of datapoints, without requiring distributed inference or severe assumptions. In particular, MSGP reduces the standard O(n³) complexity of GP learning an...
Locally-Biased Bayesian Optimization using Nonstationary Gaussian Processes
Bayesian optimization is becoming a fundamental global optimization algorithm in many applications where sample efficiency is needed, such as automatic machine learning, robotics, reinforcement learning, experimental design, and simulations. The most popular and effective Bayesian optimization methods rely on a stationary Gaussian process as the surrogate. In this paper, we present a novel n...
Journal
Journal title: Statistics and Computing
Year: 2023
ISSN: 0960-3174, 1573-1375
DOI: https://doi.org/10.1007/s11222-023-10252-0